Expectation propagation on the diluted Bayesian classifier
Authors
Abstract
Efficient feature selection from high-dimensional datasets is a very important challenge in many data-driven fields of science and engineering. We introduce a statistical-mechanics-inspired strategy that addresses the problem of sparse feature selection in the context of binary classification by leveraging a computational scheme known as expectation propagation (EP). The algorithm is used in order to train a continuous-weights perceptron learning a classification rule from a set of (possibly partly mislabeled) examples provided by a teacher perceptron with diluted continuous weights. We test the method in the Bayes optimal setting under a variety of conditions and compare it to other state-of-the-art algorithms based on message passing and on expectation maximization approximate inference schemes. Overall, our simulations show that EP is a robust and competitive algorithm in terms of variable selection properties, estimation accuracy, and computational complexity, especially when the student perceptron is trained from correlated patterns that prevent other iterative methods from converging. Furthermore, our numerical tests demonstrate that the algorithm is capable of learning online the unknown values of prior parameters, such as the dilution level of the weights of the teacher and the fraction of mislabeled examples, quite accurately. This is achieved by means of a simple maximum likelihood strategy that consists in minimizing the free energy associated with the EP algorithm.
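The teacher-student setup described in the abstract can be sketched in a few lines. This is not the paper's EP implementation, only the generative model it is trained on; the parameter names and values (`N`, `M`, `rho` for the dilution level, `eta` for the mislabeling fraction) are hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (not taken from the paper): N features,
# M examples, dilution level rho (fraction of nonzero teacher weights),
# and eta, the fraction of mislabeled examples.
N, M, rho, eta = 100, 200, 0.25, 0.05

# Teacher: diluted continuous weights -- each weight is nonzero with
# probability rho and drawn from a standard Gaussian, otherwise zero.
mask = rng.random(N) < rho
w_teacher = np.where(mask, rng.standard_normal(N), 0.0)

# Random patterns and teacher labels (sign of the pre-activation).
X = rng.standard_normal((M, N))
y = np.sign(X @ w_teacher)

# Mislabeling: flip a random fraction eta of the labels.
flip = rng.random(M) < eta
y[flip] = -y[flip]
```

A student perceptron with continuous weights is then trained on the pairs `(X, y)`, and the inference task is to recover both the support (which weights are nonzero) and the values of `w_teacher`.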
Similar resources
Expectation Propagation for Bayesian Inference
Expectation Propagation (EP) is one of the approaches to approximate inference; it was first formulated in the form we know today in [1], though the idea has roots in many previous works in various areas. It can be considered a variant of message passing in which each of the individual messages is approximated while being transferred. To introduce EP, it is easier to first start with a couple of approxima...
Expectation Propagation for Continuous Time Bayesian Networks
Continuous time Bayesian networks (CTBNs) describe structured stochastic processes with finitely many states that evolve over continuous time. A CTBN is a directed (possibly cyclic) dependency graph over a set of variables, each of which represents a finite state continuous time Markov process whose transition model is a function of its parents. As shown previously, exact inference in CTBNs is ...
Expectation Propagation for approximate Bayesian inference
This paper presents a new deterministic approximation technique in Bayesian networks. This method, “Expectation Propagation,” unifies two previous techniques: assumed-density filtering, an extension of the Kalman filter, and loopy belief propagation, an extension of belief propagation in Bayesian networks. Loopy belief propagation, because it propagates exact belief states, is useful for a limi...
Approximate Expectation Propagation for Bayesian Inference on Large-scale Problems
where $k$ indexes experimental replicates, $i$ indexes the probe positions, $j$ indexes the binding positions, and $\mathcal{N}(\sum_j a_{ji}\mu_j s_j b_j;\,\sigma_i)$ represents the probability density function of a Gaussian distribution with mean $\sum_j a_{ji}\mu_j s_j b_j$ and variance $\sigma_i$. We assign prior distributions on the binding event $b_j$ and the binding strength $s_j$: $p(b_j\mid\pi_j) = \pi_j^{b_j}(1-\pi_j)^{1-b_j}$ (3) and $p_0(s_j) = \mathrm{Gamma}(s_j\mid c_0, d_0)$ (4), where $\mathrm{Gamma}(\dots$
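The spike-and-slab prior described in this snippet (a Bernoulli prior on the binding events and a Gamma prior on the binding strengths) can be sampled as follows. The numeric values for $\pi_j$, $c_0$, and $d_0$ are hypothetical, chosen only for illustration; the snippet itself does not specify them.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical prior parameters: pi_j is the prior probability of a
# binding event at position j; (c0, d0) are the shape and rate of the
# Gamma prior on the binding strengths.
J, pi_j, c0, d0 = 50, 0.1, 2.0, 1.0

# Binding events b_j ~ Bernoulli(pi_j).
b = rng.random(J) < pi_j

# Binding strengths s_j ~ Gamma(c0, d0); NumPy parameterizes the Gamma
# by shape and scale, so scale = 1/rate.
s = rng.gamma(shape=c0, scale=1.0 / d0, size=J)

# Effective signal contribution at each position: s_j * b_j,
# which is exactly zero wherever no binding event occurred.
signal = s * b
```

This sparsity structure is what makes the posterior multimodal and motivates the approximate EP treatment discussed in the snippet.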
Journal
Journal title: Physical review
Year: 2021
ISSN: ['0556-2813', '1538-4497', '1089-490X']
DOI: https://doi.org/10.1103/physreve.103.043301